Implementing and reasoning about hash-consed data structures in Coq
We report on four different approaches to implementing hash-consing in Coq
programs. The use cases include execution inside Coq, or execution of the
extracted OCaml code. We explore the different trade-offs between faithful use
of pristine extracted code, and code that is fine-tuned to make use of OCaml
programming constructs not available in Coq. We discuss the possible
consequences in terms of performance and guarantees. We use the running
example of binary decision diagrams and then demonstrate the generality of our
solutions by applying them to other examples of hash-consed data structures.
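The core idea behind hash-consing can be sketched as follows. This is a hypothetical Python illustration, not the paper's Coq or OCaml development: a global table maps a node's contents to one canonical shared instance, so structural equality reduces to identity equality, and the BDD-style node names are invented for the example.

```python
# Hypothetical hash-consing sketch: a table maps a node's contents to
# a single canonical instance, so structurally equal nodes are the
# SAME object and equality checks become pointer comparisons.

class Node:
    __slots__ = ("var", "low", "high")

    _table = {}  # (var, id(low), id(high)) -> canonical Node

    def __new__(cls, var, low, high):
        # children are themselves hash-consed, so their identity is
        # a sound stand-in for their structure
        key = (var, id(low), id(high))
        cached = cls._table.get(key)
        if cached is not None:
            return cached              # reuse the shared instance
        node = super().__new__(cls)
        node.var, node.low, node.high = var, low, high
        cls._table[key] = node
        return node

# Two structurally equal BDD-style nodes are now the same object:
leaf_t, leaf_f = object(), object()
a = Node(0, leaf_f, leaf_t)
b = Node(0, leaf_f, leaf_t)
assert a is b
```

The design choice this illustrates is the one the abstract weighs: in extracted OCaml one would typically use weak hash tables and physical equality for the table, constructs that have no direct counterpart inside Coq.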
An accurate scheme to solve cluster dynamics equations using a Fokker-Planck approach
We present a numerical method to accurately simulate particle size
distributions within the formalism of rate equation cluster dynamics. This
method is based on a discretization of the associated Fokker-Planck equation.
We show that particular care has to be taken to discretize the advection part
of the Fokker-Planck equation, in order to avoid distortions of the
distribution due to numerical diffusion. For this purpose we use the
Kurganov-Noelle-Petrova scheme coupled with the monotonicity-preserving
reconstruction MP5, which leads to very accurate results. The merits of the
method are illustrated on the case of loop coarsening in aluminum. We show that
the choice of model used to describe the energetics of loops does not
significantly change the normalized loop distribution, while the choice of
model for the absorption coefficients seems to have a significant impact on
it.
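The numerical-diffusion problem the abstract warns about can be seen in a minimal toy: first-order upwind differencing of the advection equation f_t + v f_x = 0 smears a sharp pulse. This sketch is not the paper's Kurganov-Noelle-Petrova/MP5 discretization; grid size, CFL number, and the pulse are invented for illustration.

```python
import numpy as np

# Toy 1-D advection on a periodic grid with first-order upwind
# differencing; exact advection preserves the pulse shape, but the
# scheme's numerical diffusion decays the peak. NOT the paper's
# KNP/MP5 scheme; all parameters are invented.

n = 200
v = 1.0
dx = 1.0 / n
dt = 0.4 * dx / v                                  # CFL-stable step
x = np.linspace(0.0, 1.0, n, endpoint=False)
f = np.where(np.abs(x - 0.3) < 0.05, 1.0, 0.0)     # sharp pulse
f0_max = f.max()

for _ in range(400):
    f = f - v * dt / dx * (f - np.roll(f, 1))      # upwind difference

# f.max() is now visibly below f0_max: the pulse has been smeared.
```

Higher-order central-upwind schemes with monotonicity-preserving reconstruction, such as the KNP + MP5 combination the abstract names, exist precisely to suppress this artificial spreading without introducing spurious oscillations.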
Hybrid deterministic/stochastic algorithm for large sets of rate equations
We propose a hybrid algorithm for the time integration of large sets of rate
equations coupled by a relatively small number of degrees of freedom. A subset
containing fast degrees of freedom evolves deterministically, while the rest of
the variables evolves stochastically. The emphasis is put on the coupling
between the two subsets, in order to achieve both accuracy and efficiency. The
algorithm is tested on the problem of nucleation, growth and coarsening of
clusters of defects in iron, treated by the formalism of cluster dynamics. We
show that it is possible to obtain results indistinguishable from fully
deterministic and fully stochastic calculations, while significantly speeding
up the computations with respect to these two cases.
Comment: 9 pages, 7 figures
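The hybrid idea can be sketched in a toy two-variable model. This is not the paper's algorithm: a high-population degree of freedom is integrated deterministically, while rare discrete events are drawn stochastically from their instantaneous rate; every rate constant below is invented.

```python
import random

# Toy hybrid deterministic/stochastic step: `a` (large population,
# fast) follows its rate equation via explicit Euler; `b` (rare
# clusters) gains members through stochastic events whose rate
# depends on `a`, which couples the two subsets. Invented constants.

random.seed(0)
a, b = 1.0e6, 0                 # fast continuous / slow discrete variables
dt, k_gain, k_loss, k_nuc = 1e-3, 1.0e3, 1.0, 1.0e-4

for _ in range(1000):
    rate_nuc = k_nuc * a                        # cluster nucleation rate
    # deterministic part: da/dt = gain - loss*a - nucleation drain
    a += dt * (k_gain - k_loss * a - rate_nuc)
    # stochastic part: at most one event per step, drawn with
    # probability rate_nuc * dt (valid while rate_nuc * dt << 1)
    if random.random() < rate_nuc * dt:
        b += 1
```

The coupling runs both ways: the stochastic events drain the deterministic variable through the `rate_nuc` term, which is the accuracy-critical point the abstract emphasizes.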
Ipas : Interactive Phenomenological Animation of the Sea
No current real-time animation model of the sea simultaneously accounts for a heterogeneous water surface of up to 10 km² together with the local effects of breaking waves, winds, currents and shallow waters on wave groups, across all wavelength scales; yet these phenomena are essential for a maritime simulation to be meaningful to sailors and to remain physically believable to oceanographers. We propose a new approach to the real-time simulation of the sea: instead of numerically solving the Navier-Stokes equations on a grid of points, we use oceanographic results, both theoretical and experimental, to model autonomous entities interacting in a multi-agent system without any predefined grid. Our model, ipas (Interactive Phenomenological Animation of the Sea), includes entities such as wave groups, active and passive breaking waves, local winds, shallow waters and currents. Part of the full set of interactions has been modeled.
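A minimal sketch of the grid-free idea, with invented parameters and class names that are not taken from ipas: the surface height at any query point is the sum of contributions from autonomous wave-group entities, each carrying its own wavelength, amplitude, envelope and speed, so no predefined grid is needed.

```python
import math

# Toy grid-free sea surface: each wave-group agent contributes a
# travelling sinusoid under a moving Gaussian envelope; the height
# at (x, t) is just the sum over agents. Invented parameters; this
# is an illustration of the idea, not the ipas model.

class WaveGroup:
    def __init__(self, x0, amp, wavelength, width, speed):
        self.x0, self.amp, self.wavelength = x0, amp, wavelength
        self.width, self.speed = width, speed

    def height(self, x, t):
        c = self.x0 + self.speed * t              # group centre moves
        envelope = math.exp(-((x - c) / self.width) ** 2)
        k = 2 * math.pi / self.wavelength
        return self.amp * envelope * math.sin(k * (x - self.speed * t))

groups = [WaveGroup(0.0, 1.0, 10.0, 25.0, 2.0),
          WaveGroup(40.0, 0.5, 3.0, 8.0, 4.0)]

def sea_height(x, t):
    return sum(g.height(x, t) for g in groups)
```

Because each agent is self-contained, local effects (a breaking event, a wind gust) can be added or removed as further entities without re-meshing anything.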
An Investigation Of Organizational Information Security Risk Analysis
Despite a growing number and variety of information security threats, many organizations continue to neglect implementing information security policies and procedures. The likelihood that an organization’s information systems can fall victim to these threats is known as information systems risk (Straub & Welke, 1998). To combat these threats, an organization must undergo a rigorous process of self-analysis. To better understand the current state of this information security risk analysis (ISRA) process, this study deployed a questionnaire using both open-ended and closed-ended questions administered to a group of information security professionals (N=32). The qualitative and quantitative results of this study show that organizations are beginning to conduct regularly scheduled ISRA processes. However, the results also show that organizations still have room for improvement in creating ideal ISRA processes.
The use of ion mobility spectrometry and gas chromatography/mass spectrometry for the detection of illicit drugs on clandestine records
Illicit drug distribution has grown tremendously over the past decade, from simple 'drug pushing', in which drugs were distributed by poorly organized individuals, to today's well organized and well financed drug cartels. This change to a more 'corporate-like' atmosphere has resulted in greater use of record keeping to monitor the profits generated. The use of record keeping by drug distributors is not restricted to high-level drug smugglers but occurs at all levels within the distribution network. Dealers at all levels, including street dealers, are generally 'fronted', that is, given quantities of drugs on consignment that they in turn sell to customers, thereby creating the need for records to track drug sales against liabilities. Because of their illicit nature, these records are often encrypted to hide the fact that they are indeed records of drug transactions. The creation of a handwritten notation concerning a drug transaction is normally brought on by a purchase or sale. In a sale, this is commonly accomplished through a consignment, or the designation of a quantity to a customer to whom that amount has been 'fronted'. Because this activity generates a debt, an accounting for payments made, as well as for new transactions completed, follows logically. One of the most common means of representing these is an 'accounting flow', in which payments are subtracted from a running balance while new sales are added to it. The examination of illicit drug records has been key to the prosecution of numerous federal, state, and local drug cases for a number of years. The Document Section of the FBI Laboratory, through its Racketeering Records Analysis Unit (RRAU), has been involved in such analytical efforts since 1983. Detailed analytical research has brought about an evolution in the systematic approach used by the RRAU since that time.
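The 'accounting flow' described above reduces to a running balance, which the following toy sketch illustrates with invented figures:

```python
# Illustrative running-balance 'accounting flow': newly fronted
# amounts are added to the balance, payments are subtracted from it.
# All figures are invented for illustration.

entries = [("front", 500), ("payment", 200), ("front", 300), ("payment", 450)]

balance = 0
for kind, amount in entries:
    balance += amount if kind == "front" else -amount

print(balance)   # remaining debt: 150
```

A ledger of this shape is what lets an examiner distinguish sales from payments even when the notations themselves are encrypted.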
The close proximity of the drugs to the records often results in trace drug evidence being transferred to the records. The detection of trace drug residue on surfaces by ion mobility spectrometry (IMS) is well documented in the literature. The following procedure deals primarily with the newer techniques of trace drug analysis and drug record analysis developed by the Chemistry/Toxicology Unit of the FBI Laboratory, since the more traditional techniques of latent fingerprint analysis and document analysis are well known.
Probing punctual magnetic singularities during magnetization process in FePd films
We report the use of Lorentz microscopy to observe the domain wall structure
during the magnetization process in FePd thin foils. We have focused on the
magnetic structure of domain walls of bubble-shaped magnetic domains near
saturation. Regions are found along the domain walls where the magnetization
abruptly reverses. Multiscale magnetic simulations show that these regions are
vertical Bloch lines (VBLs), and the different bubble shapes observed are then
related to the inner structure of the VBLs. We were thus able to probe the
presence of magnetic singularities as small as Bloch points in the inner
magnetization of the domain walls.
Four-dimensional Printing on Textiles: Evaluating digital file-to-fabrication workflows for self-forming composite shell structures
This design-led research investigates the development of self-forming wearable composite structures by printing embossed patterns out of flexible filament on pre-stretched textiles and releasing the stress after the printing has been completed. In particular, the study presents and compares three ‘file-to-fabrication’ techniques for generating self-forming textile shell structures: the first is based on modified geometrical patterns in relation to curvature analysis, the second on printed patterns related to their stress-line simulation, and the third on an analysis of the anisotropic shrinking behaviour of stripe patterns. The findings emphasize the advantages and challenges of each method and present a comparative table highlighting the relationship between material properties, pattern geometry and the formal vocabulary of the composite shells.
A Coherent Interpretation of the Form Factors of the Nucleon in Terms of a Pion Cloud and Constituent Quarks
The recent unbiased measurements of the electric form factor of the neutron
suggest that its shape may be interpreted as a smooth broad distribution with a
bump at Q^2 \approx 0.3(GeV/c)^2 superimposed. As a consequence the
corresponding charge distribution in the Breit frame shows a negative charge
extending as far out as 2 fm. It is natural to identify this charge with the
pion cloud. This realisation is then used to reanalyse all old and new data of
the electric and magnetic form factors of the proton and the neutron by a
phenomenological fit and by a fit based on the constituent quark model. It is
shown that it is possible to fit all form factors coherently with both
Ansätze and that they all show the signal of the pion cloud.
Comment: 17 pages, 17 figures
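As a purely illustrative sketch, not the paper's parametrization or fitted values, a two-component form of the kind the abstract describes can be written as a smooth term plus a bump near Q² ≈ 0.3 (GeV/c)²; every parameter value below is an invented placeholder.

```python
import math

# Hypothetical two-component neutron electric form factor, in the
# spirit of the abstract: a smooth dipole-like term plus a Gaussian
# bump near Q^2 ~ 0.3 (GeV/c)^2 standing in for the pion-cloud
# signal. Both terms carry a factor of q2 so the form factor
# vanishes at Q^2 = 0, as the neutron's zero charge requires.
# All numerical parameters are invented placeholders.

def g_en(q2, a=0.9, m2=0.7, b=0.15, q2_bump=0.3, w=0.1):
    smooth = a * q2 / (1.0 + q2 / m2) ** 2            # broad distribution
    bump = b * q2 * math.exp(-((q2 - q2_bump) / w) ** 2)
    return smooth + bump
```

In a fit of this shape, the bump amplitude is the quantity that carries the pion-cloud signal the abstract refers to; setting it to zero recovers the smooth background alone.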